Can doctors trust ChatGPT? No, but they use it anyway

Monday, August 7, 2023
Research published in the July issue of Medical Economics suggests that 10% of US healthcare providers already use ChatGPT regularly, and that 90% plan to adopt the technology for EHR data entry and appointment scheduling. Most physicians who had not used generative AI before keep using it once they have tried it.

The use of ChatGPT among patients is also on the rise. Twenty-five percent of those surveyed said they would rather talk to AI than see a doctor. Experts agree: AI-powered automation will improve productivity, reduce costs, increase the availability of health services, and ease bureaucratic burdens. There are also plenty of concerns: reduced human interaction, blind reliance on AI, and the security of transmitted data. But these barriers are not stopping the expansion of AI. Eighty percent of those who have already asked ChatGPT for medical advice found it an effective alternative, and some patients already believe that in the future they will not consult a doctor who refuses to use AI. At the same time, more than half of those surveyed admit that artificial intelligence will not replace contact with a doctor.

There are risks, but current medicine isn't perfect either

Every tool used in medicine comes with risks. Physicians misdiagnose; hospital-acquired infections occur despite strict hygiene standards. AI is no exception, which is why a risk-benefit assessment is necessary. At the same time, AI is learning fast, and models dedicated to medicine are being developed. Instead of refusing to use AI, doctors should learn how to use it safely and verify the correctness of the answers they receive. Just as with Google, where the first page of search results shows what is most popular but not necessarily most reliable, AI needs to be applied smartly.

How do doctors use ChatGPT?

According to unofficial data, doctors most often use it to verify a diagnosis or adjust a medication plan, that is, in routine care and clinical decision-making. Available research shows that generative AI can already be used safely for creating clinical letters and communicating with health insurers. ChatGPT is well suited to explaining complex issues to patients, creating preventive recommendations, and supporting logistics: scheduling patient flow and doctors' work, creating guides for a website, and answering standard patient questions. Proper use of AI tools can cut administrative costs by up to half.

No one prohibits doctors from asking AI for an opinion. It can be done, but with great caution and while ensuring that personal information is strictly protected. ChatGPT is not a medical device, so doctors remain responsible in the event of an error.

One thing is certain about the development of AI in medicine: ChatGPT and similar solutions will become increasingly common and will turn into a standard tool in the doctor's office, serving as an administrative and clinical assistant. Every doctor has limited cognitive capacity, regardless of experience and knowledge, and such an automated assistant can be the perfect aide, taking over the paperwork that no one likes. If progress in AI remains as rapid as it is now, there is also a good chance that a system similar to ChatGPT, but classified as a medical device, will appear before long. One of the companies working on this is Google: its large language model Med-PaLM 2 recently went into testing at selected healthcare facilities in the US. Tests are also underway on AI solutions that automatically create electronic medical documentation and draw conclusions by augmenting electronic medical records with insights from scientific papers.
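As a rough illustration of the clerical uses listed above, here is a minimal sketch of drafting a patient-friendly letter with a general-purpose LLM API. It assumes the OpenAI Python client (openai>=1.0) with an API key in the OPENAI_API_KEY environment variable; the model name, prompts, and case details are placeholders, not a recommended clinical workflow. Real patient data must be de-identified and handled under local data-protection rules, and the physician remains responsible for the final text.

```python
# Illustrative sketch only: drafting a patient-friendly explanation letter
# with a general-purpose LLM. Assumes the OpenAI Python client (openai>=1.0)
# and an API key in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# De-identified placeholder input; never send real patient identifiers.
case_summary = (
    "Adult patient newly diagnosed with type 2 diabetes, started on metformin, "
    "needs a plain-language explanation of the condition and lifestyle advice."
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name for this sketch
    messages=[
        {
            "role": "system",
            "content": (
                "You draft patient-friendly letters for a clinic. "
                "Use plain language and note that the treating physician "
                "must confirm all advice."
            ),
        },
        {"role": "user", "content": f"Draft a short letter: {case_summary}"},
    ],
)

draft_letter = response.choices[0].message.content
print(draft_letter)  # the physician reviews and edits before anything is sent
```

The same pattern would cover other routine tasks mentioned above, such as answering standard patient questions or drafting messages to insurers, always with a human review step before anything leaves the practice.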

Talking to AI like it's an assistant

A report published in Medical Economics also outlines the future of artificial intelligence in the doctor's office. Experts agree that with AI, the days of typing notes on a keyboard are numbered, and that interaction with the IT system will increasingly happen by voice. According to research by Voicebot.ai, the number of people in the US using voice assistants in healthcare grew from 20 million in mid-2019 to 54.5 million in 2021. The most advanced voice assistants can now process information with 99% or greater accuracy, and the technology is still developing rapidly, so new voice features can be expected in the coming months and years.

Doctors will talk to AI systems, give them instructions, and enter data into electronic medical records based on the conversation with the patient. Before a visit begins, they will ask for a summary of the patient's medical history; at the end, they will ask for the recommendations and follow-up requests to be sent to the patient. A pilot study conducted between 2021 and 2022 by the American Academy of Family Physicians found that using a voice-enabled AI assistant reduced documentation time by 72% per month, an average saving of 3.3 hours per week. Synchronizing the doctor's dictated notes with the EHR, and that is only a fraction of what AI can already do, saves hours of typing. Doctors can spend the time saved on patient care they previously had no time for, such as preventive care, or simply on a well-deserved rest instead of doing paperwork after hours. This is what healthcare professionals look forward to most.
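The dictation workflow described above can be illustrated with a short, hedged example: transcribe a recorded note with a speech-to-text API, then ask an LLM to turn the transcript into a draft note for review. It assumes the OpenAI Python client and its Whisper transcription endpoint; the audio file name, model names, and prompt are hypothetical, and a real deployment would additionally need patient consent, de-identification, and integration with the EHR vendor's own interface.

```python
# Illustrative sketch of voice-to-note dictation, not a production integration.
# Assumes the OpenAI Python client (openai>=1.0) with OPENAI_API_KEY set.
from openai import OpenAI

client = OpenAI()

# Step 1: transcribe the doctor's dictated note (hypothetical audio file).
with open("visit_dictation.m4a", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="whisper-1",
        file=audio_file,
    )

# Step 2: turn the raw transcript into a structured draft note.
summary = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name for this sketch
    messages=[
        {
            "role": "system",
            "content": (
                "Convert the dictated text into a concise draft visit note "
                "and flag anything ambiguous for the physician."
            ),
        },
        {"role": "user", "content": transcript.text},
    ],
)

draft_note = summary.choices[0].message.content
print(draft_note)  # the physician reviews and signs off before it enters the EHR
```

Writing the approved note into the record itself would go through the EHR vendor's interface (for example, an HL7 FHIR API), which is beyond the scope of this sketch.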